Boosting Classifiers Built from Different Subsets of Features
Authors
Abstract
We focus on the adaptation of boosting to representation spaces composed of different subsets of features. Rather than forcing a single weak learner to handle data that may come from different sources (e.g., images, texts, and sounds), we suggest decomposing the learning task into several dependent boosting sub-problems, treated by different weak learners, that will optimally c...
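The decomposition described above suggests a natural baseline: train one boosted ensemble per feature subset ("view") and combine their votes. The sketch below assumes scikit-learn and trains the views independently; the paper's dependent sub-problems are more involved, and the names fit_per_view, predict_views, and the column-index views are hypothetical.

import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

def fit_per_view(X, y, views):
    # views: list of column-index arrays, one per data source
    # (e.g., image features, text features, sound features).
    models = []
    for cols in views:
        clf = AdaBoostClassifier(
            estimator=DecisionTreeClassifier(max_depth=1),  # decision stumps
            n_estimators=50,
        )
        models.append((cols, clf.fit(X[:, cols], y)))
    return models

def predict_views(models, X):
    # Average the per-view class-probability estimates, then vote.
    probs = np.mean(
        [clf.predict_proba(X[:, cols]) for cols, clf in models], axis=0
    )
    return models[0][1].classes_[probs.argmax(axis=1)]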
Similar resources

Boosting Classifiers Regionally
This paper presents a new algorithm for boosting the performance of an ensemble of classifiers. In boosting, a series of classifiers is used to predict the class of data, where later members of the series concentrate on training data that is incorrectly predicted by earlier members. To make a prediction about a new pattern, each classifier predicts the class of the pattern, and these predictions ...
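For reference, the mechanism sketched in this abstract is the standard AdaBoost loop: reweight the training set each round so the next weak learner concentrates on the examples the current ensemble gets wrong, then combine the weak predictions by a weighted vote. A minimal version for binary +1/-1 labels, assuming scikit-learn stumps as weak learners (this is textbook AdaBoost, not the regional variant the paper proposes):

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform weights
    ensemble = []
    for _ in range(rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()         # weighted training error
        if err >= 0.5:                   # no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(alpha * stump.predict(X) for alpha, stump in ensemble)
    return np.sign(score)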
Boosting recombined weak classifiers
Boosting is a set of methods for the construction of classifier ensembles. The distinguishing feature of these methods is that they make it possible to obtain a strong classifier from a combination of weak classifiers. Therefore, it is possible to use boosting methods with very simple base classifiers. One of the simplest classifiers is the decision stump, a decision tree with only one decision node. This...
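A decision stump of the kind mentioned above can be written in a few lines: pick the single feature, threshold, and polarity with the lowest weighted error. A minimal sketch for binary +1/-1 labels (fit_stump and stump_predict are hypothetical names):

import numpy as np

def fit_stump(X, y, w):
    # Exhaustively search (feature, threshold, polarity) for the split
    # with the smallest weighted error.
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(X[:, j] <= t, pol, -pol)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, t, pol)
    return best

def stump_predict(stump, X):
    j, t, pol = stump
    return np.where(X[:, j] <= t, pol, -pol)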
Boosting classifiers for drifting concepts
This paper proposes a boosting-like method to train a classifier ensemble from data streams. It naturally adapts to concept drift and makes it possible to quantify the drift in terms of its base learners. The algorithm is empirically shown to outperform learning algorithms that ignore concept drift. It performs no worse than advanced adaptive time window and example selection strategies that store all the...
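The paper's boosting-like stream method is not reproduced here, but the general idea of adapting to drift through the base learners can be illustrated with a simple accuracy-weighted ensemble: each new batch re-scores the existing members, so learners trained on outdated concepts lose influence and are eventually replaced. A hedged sketch (DriftEnsemble is a hypothetical class, assuming scikit-learn and +1/-1 labels):

import numpy as np
from sklearn.tree import DecisionTreeClassifier

class DriftEnsemble:
    def __init__(self, capacity=10):
        self.capacity = capacity
        self.members = []                # list of (weight, classifier)

    def partial_fit(self, X, y):
        # Re-score existing members on the newest batch; members trained
        # on an old concept receive low weights.
        self.members = [(np.mean(c.predict(X) == y), c)
                        for _, c in self.members]
        self.members.append(
            (1.0, DecisionTreeClassifier(max_depth=3).fit(X, y)))
        # Keep only the best-scoring members.
        self.members.sort(key=lambda m: m[0], reverse=True)
        self.members = self.members[: self.capacity]

    def predict(self, X):
        votes = sum(wgt * clf.predict(X) for wgt, clf in self.members)
        return np.sign(votes)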
Multiclass Boosting for Weak Classifiers
AdaBoost.M2 is a boosting algorithm designed for multiclass problems with weak base classifiers. The algorithm is designed to minimize a very loose bound on the training error. We propose two alternative boosting algorithms that also minimize bounds on performance measures. These performance measures are not as strongly connected to the expected error as the training error is, but the derived bou...
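For comparison, scikit-learn ships a multiclass boosting implementation based on the SAMME family of algorithms, not AdaBoost.M2 or the bounds proposed in this paper; with depth-1 trees it exercises the weak-base-classifier setting the abstract refers to:

from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # weak stumps
    n_estimators=100,
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))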
Journal
Journal title: Fundamenta Informaticae
Year: 2009
ISSN: 0169-2968
DOI: 10.3233/fi-2009-169